
    Ninja data analysis with a detection pipeline based on the Hilbert-Huang Transform

    The NINJA data analysis challenge allowed the sensitivity of data analysis pipelines to binary black hole numerical relativity waveforms to be studied in simulated Gaussian noise at the design sensitivity of the LIGO and Virgo observatories. We analyzed the NINJA data with a pipeline based on the Hilbert-Huang Transform, comprising a detection stage and a characterization stage: detection is performed by triggering on excess instantaneous power, and characterization by displaying the kernel-density-enhanced (KD) time-frequency trace of the signal. Using the simulated data for the two LIGO detectors, we detected 77 of 126 signals above SNR 5 in coincidence, with 43 of the missed events having signal-to-noise ratio (SNR) below 10. Characterization of the detected signals revealed the merger part of the waveform at high time and frequency resolution, free from time-frequency uncertainty. We estimated the time lag of the signals between the detectors from the optimal overlap of the individual KD time-frequency maps, yielding estimates accurate to within a fraction of a millisecond for half of the events. A coherent addition of the data sets according to the estimated time lag was then used to characterize the event. Comment: Accepted for publication in CQG, special issue NRDA proceedings 200
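
    The pipeline's time-lag step is described as maximising the overlap of the two detectors' KD time-frequency maps. As a rough illustration of that idea only, the sketch below cross-correlates two precomputed time-frequency maps over a range of time shifts; the map construction (HHT, kernel-density enhancement), the sampling rate and the synthetic "chirp" ridge are all assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: estimate the time lag between two detectors by
# sliding one (frequency x time) map past the other and maximising the
# overlap. The KD time-frequency maps are assumed to be precomputed and
# sampled on a common grid; here they are plain numpy arrays.
import numpy as np

def estimate_time_lag(tf_map_h1, tf_map_l1, dt, max_lag_s=0.011):
    """Return the lag (seconds) that maximises the overlap of two maps.

    tf_map_h1, tf_map_l1 : arrays of shape (n_freq, n_time)
    dt                   : time resolution of the maps in seconds
    max_lag_s            : largest light-travel-time lag to search over
    """
    max_shift = int(max_lag_s / dt)
    shifts = range(-max_shift, max_shift + 1)
    overlaps = []
    for s in shifts:
        shifted = np.roll(tf_map_l1, s, axis=1)   # shift along the time axis
        overlaps.append(np.sum(tf_map_h1 * shifted))
    best = shifts[int(np.argmax(overlaps))]
    return best * dt

# Toy example: a common "chirp" ridge offset by ~3 ms between the maps.
dt = 1.0 / 2048.0
n_f, n_t = 64, 4096
rng = np.random.default_rng(0)
t_idx = np.arange(1000, 1400)
f_idx = np.linspace(5, 60, t_idx.size).astype(int)
h1 = rng.normal(0.0, 0.1, size=(n_f, n_t))
h1[f_idx, t_idx] += 5.0
l1 = rng.normal(0.0, 0.1, size=(n_f, n_t))
l1[f_idx, t_idx + 6] += 5.0                       # 6 samples ~ 3 ms later
print(estimate_time_lag(h1, l1, dt))              # ~ -0.003 s in this toy
```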

    Potential for grid efficiency based on a combination of leakage reactances of transformers of a transmission interconnecting line: Application of an exhaustive search algorithm

    Transmission interconnecting lines (called interconnectors in this study) are built to facilitate the exchange of active and reactive power between two areas of a network. Step-up and step-down transformers are required at the ends of the interconnector when it operates at a different voltage, usually higher, than the networks to be connected. A study was carried out to examine the impact of the combination of leakage reactances of the transformers at the ends of an interconnector on active power losses. The study assessed whether different combinations lead to different levels of active power losses and thus affect the efficiency of the system. It was found that the combination of reactances has a tangible impact on the power that flows through the interconnector and, consequently, on the sharing of apparent power between the interconnector and the rest of the network. The total active power losses varied appreciably across the various combinations of reactances, so the life-cycle cost of active power losses also varied with the combination. The study showed that the combination needs to be chosen carefully, since the choice can have a significant impact on techno-economic aspects of the power system.
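
    The exhaustive-search idea can be pictured with a short sketch: enumerate every pairing of candidate leakage reactances for the two transformers and keep the pairing that yields the lowest total active power losses. The candidate reactance values and the loss function below are placeholders standing in for the study's network model and load-flow calculations.

```python
# Hypothetical sketch of the exhaustive search: try every combination of
# sending-end and receiving-end transformer leakage reactances and keep
# the one with the lowest total active power losses. evaluate_losses()
# is a placeholder for a proper load-flow study; the per-unit reactance
# values are illustrative, not data from the paper.
from itertools import product

SENDING_END_X = [0.08, 0.10, 0.12, 0.14]      # candidate reactances (p.u.)
RECEIVING_END_X = [0.08, 0.10, 0.12, 0.14]

def evaluate_losses(x_send: float, x_recv: float) -> float:
    """Placeholder: run a load flow for this reactance pair and return
    the total active power losses in MW. A made-up smooth function
    stands in here so the search loop is runnable."""
    return (25.0 + 40.0 * (x_send - 0.11) ** 2
            + 55.0 * (x_recv - 0.09) ** 2
            + 30.0 * x_send * x_recv)

best = min(product(SENDING_END_X, RECEIVING_END_X),
           key=lambda pair: evaluate_losses(*pair))
print("lowest-loss combination (p.u.):", best,
      "-> losses (MW):", round(evaluate_losses(*best), 3))
```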

    Fatty acids in beef from grain- and grass-fed cattle: the unique South African scenario

    Objective: Different fatty acids elicit different responses in the human body once ingested. Although red meat is often considered to be a source of fatty acids that have a negative impact on human health, many studies have reported variability in the quantity and quality of the fatty acids found in red meat produced under different production systems in different countries. This study evaluated the fatty acid profile of beef produced by the grass- and grain-fed production systems practised in South Africa. Design: Data are reported as a percentage of lipid per 100 g total fat to enable comparison with international findings. The findings are also translated into edible meat portions, taking fat trimming (often associated with red meat intake) into consideration, in order to determine the contribution that the different products can make to the human diet. Subjects and setting: Three cuts of beef from cattle from four production groups were sampled, and the fatty acid composition of the meat and fat fractions was analysed. Results: Notable differences were found in the quantity and quality of the different fatty acids in beef from the different production systems. When untrimmed, no statistically significant difference in total fat was found between beef produced on the different production systems. Differences became more significant as trimming was performed. When trimmed of all visible fat, beef from young cattle fed on a grain-based feeding system contained less total fat (6.96 g) and less saturated fat (2.16 g) per 100 g than beef from their grass-fed counterparts (9.77 g and 3.30 g, respectively). The omega-6 to omega-3 fatty acid ratio was more favourable for grass-fed cattle, i.e. 2.0–2.5:1.0, compared with 8–30:1 for grain-fed cattle, irrespective of the degree of trimming. The beef from grass-fed cattle also contained a higher quantity of conjugated linoleic acid. Conclusion: A unique classification system for red meat has been implemented in South Africa and dictates the characteristics of the fresh meat available to consumers. The results of this study consequently indicate distinctive differences between the fatty acid profile of local red meat and that of beef produced in other countries, which is often used as a reference for dietary guidance. Keywords: grain fed, grass fed, cattle, fatty acids, red meat
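
    As a small illustration of the conversion described in the Design section, the sketch below rescales per-100 g figures to an assumed edible portion and computes an omega-6 to omega-3 ratio; only the trimmed total and saturated fat values quoted above are taken from the abstract, while the portion size and the fatty-acid contents used for the ratio are made up for illustration.

```python
# Hypothetical sketch of the per-portion conversion. The portion size
# and the n-6/n-3 contents are illustrative, not study data; only the
# per-100 g totals quoted in the abstract are reused.
def per_portion(value_per_100g: float, portion_g: float) -> float:
    return value_per_100g * portion_g / 100.0

PORTION_G = 150.0                      # assumed edible portion size

# Per-100 g figures quoted in the abstract for fully trimmed beef.
grain_fed = {"total_fat_g": 6.96, "sat_fat_g": 2.16}
grass_fed = {"total_fat_g": 9.77, "sat_fat_g": 3.30}

for name, comp in (("grain-fed", grain_fed), ("grass-fed", grass_fed)):
    print(name,
          "fat per portion:", round(per_portion(comp["total_fat_g"], PORTION_G), 2), "g,",
          "saturated:", round(per_portion(comp["sat_fat_g"], PORTION_G), 2), "g")

# n-6:n-3 ratio from (illustrative) absolute contents in mg per 100 g.
n6_mg, n3_mg = 250.0, 110.0
print("example n-6:n-3 ratio:", round(n6_mg / n3_mg, 2), ": 1")
```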

    Black Hole Mergers and Unstable Circular Orbits

    We describe recent numerical simulations of the merger of a class of equal-mass, non-spinning, eccentric binary black hole systems in general relativity. We show that with appropriate fine-tuning of the initial conditions to a region of parameter space we denote the threshold of immediate merger, the binary enters a phase of close interaction in a near-circular orbit, stays there for an amount of time proportional to the logarithm of the distance from the threshold in parameter space, and then either separates or merges to form a single Kerr black hole. To gain a better understanding of this phenomenon we study an analogous problem in the evolution of equatorial geodesics about a central Kerr black hole. A similar threshold of capture exists for appropriate classes of initial conditions, and when tuned to threshold the geodesics approach one of the unstable circular geodesics of the Kerr spacetime. Remarkably, with a natural mapping of the parameters of the geodesic to those of the equal-mass system, the scaling exponent describing the whirl phase of each system turns out to be quite similar. Armed with this lone piece of evidence that an approximate correspondence might exist between near-threshold evolution of geodesics and generic binary mergers, we illustrate how this information can be used to estimate the cross section and energy emitted in the ultrarelativistic black hole scattering problem. This could eventually be of use in providing estimates for the related problem of parton collisions at the Large Hadron Collider in extra-dimension scenarios where black holes are produced. Comment: 16 pages, 12 figures; updated to coincide with journal version
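
    The quoted scaling, whirl time proportional to the logarithm of the distance from the threshold of immediate merger, suggests a simple way to extract the exponent from a set of tuned runs: fit the measured whirl times against ln|p - p*|. The sketch below does this on synthetic data; the parameter names and numbers are illustrative and are not taken from the simulations in the paper.

```python
# Hypothetical sketch: near the threshold of immediate merger the time
# spent whirling is expected to scale as T ~ -gamma * ln|p - p*| + const.
# Given (p, T) pairs from tuned runs, gamma can be read off with a
# linear fit in ln|p - p*|. The data below are synthetic; p_star and
# gamma_true are illustrative, not values from the paper.
import numpy as np

p_star = 0.5                                    # assumed threshold value
gamma_true, offset = 1.7, 4.0

# Synthetic "whirl times" for parameters approaching the threshold.
p = p_star + 10.0 ** np.arange(-2, -9, -1)      # 1e-2 ... 1e-8 above p*
T = (-gamma_true * np.log(np.abs(p - p_star)) + offset
     + np.random.default_rng(1).normal(0.0, 0.05, p.size))

# Fit T against ln|p - p*|; the slope estimates -gamma.
slope, intercept = np.polyfit(np.log(np.abs(p - p_star)), T, 1)
print("estimated scaling exponent gamma ~", round(-slope, 3))
```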

    ‘Esprit de corps’: Towards collaborative integration of pharmacists and nurses into antimicrobial stewardship programmes in South Africa

    With the global threat of antimicrobial resistance now more urgent than ever, there should be wider collaboration between members of the multidisciplinary healthcare team. This article proposes possible ways of engagement between the pharmacist, nurse and doctor. The pharmacist and nurse are ideally placed, through united efforts (camaraderie), to redirect healthcare towards improved patient outcomes while also reducing antimicrobial resistance.

    Evaluation of the impact of distributed synchronous generation on the stochastic estimation of financial costs of voltage sags

    Power system faults can cause voltage sags which, if they fall below the voltage sensitivity threshold of equipment, can lead to interruption of supply and to financial losses. The impact of distributed generation (DG) on these financial losses is investigated in this work. Using the method of fault positions, a stochastic approach is developed to determine voltage sag performance, i.e. profiles of the magnitudes of the remaining voltages at a monitoring point for faults occurring along lines in the network. From these profiles, the expected number of critical voltage sags at the monitoring point is calculated, and the expected cost of these sags is derived for various voltage sensitivity threshold limits. An illustrative study is carried out comparing the expected costs of voltage sags for a network without DG and for a case with DG, for various mixes of customers. It is shown that in the presence of DG the expected costs of voltage sags are lower for all voltage sensitivity criteria assumed and for all customer mixes. The study demonstrates that incorporating DG sources results in a reduction in the expected cost of voltage sags.
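
    A rough sketch of the bookkeeping behind the method of fault positions is given below: faults are placed at many positions along each line, a placeholder short-circuit calculation gives the remaining voltage at the monitoring point, positions are weighted by the line's fault rate, and sags below each sensitivity threshold are costed. The network data, the voltage model, the effect attributed to DG and the cost per interruption are all assumptions, not values from the study.

```python
# Hypothetical sketch of the stochastic bookkeeping in the method of
# fault positions. remaining_voltage() is a placeholder for the actual
# short-circuit calculation at the monitoring point; line lengths, fault
# rates, thresholds and the cost per tripped event are illustrative.
import numpy as np

LINES = {                       # line: (length_km, faults per km per year)
    "line_A": (40.0, 0.05),
    "line_B": (25.0, 0.08),
}
THRESHOLDS_PU = [0.9, 0.8, 0.7]         # equipment voltage-sensitivity limits
COST_PER_TRIP = 12_000.0                # assumed cost of one interruption

def remaining_voltage(line: str, position_km: float, with_dg: bool) -> float:
    """Placeholder fault study: remaining voltage (p.u.) at the
    monitoring point for a fault at this position. A DG unit near the
    monitoring point is crudely modelled as supporting the voltage."""
    length = LINES[line][0]
    v = 0.4 + 0.6 * position_km / length        # deeper sag for close-in faults
    return min(1.0, v + (0.07 if with_dg else 0.0))

def expected_sag_cost(with_dg: bool, n_positions: int = 200) -> dict:
    cost = {thr: 0.0 for thr in THRESHOLDS_PU}
    for line, (length, rate_per_km) in LINES.items():
        positions = np.linspace(0.0, length, n_positions)
        weight = rate_per_km * length / n_positions   # faults/yr per position
        for x in positions:
            v = remaining_voltage(line, x, with_dg)
            for thr in THRESHOLDS_PU:
                if v < thr:                           # sag trips this equipment
                    cost[thr] += weight * COST_PER_TRIP
    return cost

print("expected annual sag cost, no DG :", expected_sag_cost(with_dg=False))
print("expected annual sag cost, with DG:", expected_sag_cost(with_dg=True))
```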

    Verifying data for the implementation of the water release module of the WAS program

    The Water Administration System (WAS) is designed as a management tool for irrigation schemes and water control offices that want to manage their water accounts and supply water to clients through canal networks, pipelines and rivers. The ultimate aim of WAS is to optimise irrigation water management and minimise management-related distribution losses in irrigation canals. This research project focused on the implementation of the water release module of the WAS program at the Vaalharts irrigation scheme. WAS consists of four modules that are integrated into a single program that can be used on a single PC or in a multi-user environment. The four modules are an administration module, a water request module, a water accounts module and a water release module. The first three modules are already implemented at Vaalharts, while the fourth, the water release module, is only partially implemented. This module links with the water request module and calculates water releases for the main canal and all its branches, allowing for lag times and any water losses and accruals. To calculate these releases precisely, accurate data are needed to ensure that the correct volume of water is released into the canal network, which can be achieved by verifying existing data against field data. To optimise the management of the irrigation scheme, the fully implemented WAS program needs to be installed and running at the scheme, and a series of data and calculation verifications needs to be executed. This exercise will show the adequacy and correctness of the database from which WAS does the release calculations, ensuring improved management of the irrigation scheme, the catchment and water resource sustainability. It is planned that the information generated from this project will be used in the compilation of an integrated catchment management information system, currently underway at the Central University of Technology, Free State, South Africa. It is for this reason that all data should be verified, as trustworthy results and service through management can then be offered to the community and the irrigation area.
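
    To make the role of the water release module concrete, the sketch below shows the kind of calculation the abstract describes: downstream requests are shifted back by each reach's lag time and adjusted for losses and accruals to give the volume to release at the canal head. The reach names, lag times, loss fractions and accruals are invented for illustration and do not come from the WAS database.

```python
# Hypothetical sketch of the release calculation the water release
# module is described as performing: requested volumes downstream are
# shifted back by the reach's lag time and adjusted for losses and
# accruals to give the volume to release at the canal head.
from collections import defaultdict

REACHES = [                 # (name, lag in days, loss fraction, accrual m^3/day)
    ("main_canal",   1, 0.06,   0.0),
    ("branch_north", 2, 0.04, 500.0),
]

def head_releases(requests: dict[str, dict[int, float]]) -> dict[int, float]:
    """requests[reach][day] = volume (m^3) the users on that reach need.
    Returns the volume to release at the canal head per day."""
    release = defaultdict(float)
    for name, lag, loss, accrual in REACHES:
        for day, demand in requests.get(name, {}).items():
            net = max(demand - accrual, 0.0)       # accruals reduce the need
            gross = net / (1.0 - loss)             # gross up for canal losses
            release[day - lag] += gross            # release lag days earlier
    return dict(sorted(release.items()))

requests = {
    "main_canal":   {10: 20_000.0, 11: 18_000.0},
    "branch_north": {11: 8_000.0},
}
print(head_releases(requests))
```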

    Catchment management-model evaluation : verifying data for the implementation of the water release module of the WAS program

    The Water Administration System (WAS) is designed as a management tool for irrigation schemes and water offices that want to manage their water accounts and supply water to clients through canal networks, pipelines and rivers. The ultimate aim of WAS is to optimize irrigation water management and minimize management-related distribution losses in irrigation canals. This research project focuses on the implementation of the water release module of the WAS program at the Vaalharts irrigation scheme. WAS consists of four modules that are integrated into a single program that can be used on a single PC, on a PC network (currently in use at Vaalharts) or in a multi-user environment. These modules can be implemented partially or as a whole, depending on the requirements of the specific scheme or office. The four modules are an administration module, a water request module, a water accounts module and a water release module. The first three modules are already implemented at Vaalharts, while the fourth, the water release module, is only partially implemented. This module links with the water request module and calculates water releases for the main canal and all its branches, allowing for lag times and any water losses and accruals. Any researcher in this field should first understand where the water comes from and how it will be utilized before any calculations are attempted; only then can manipulation of the release volume commence. To calculate these releases precisely, accurate data are needed to ensure that the correct volume of water is released into the canal network, which can be achieved by verifying existing data against field data. To optimize the management of the irrigation scheme, the fully implemented WAS program needs to be installed and running at the scheme, and a series of data and calculation verifications needs to be executed. This exercise will show the adequacy and correctness of the database from which WAS does the release calculations, ensuring improved management of the irrigation scheme, the catchment and water resource sustainability. It is planned that the information generated from this project will be used in the compilation of an integrated catchment management information system, currently underway in the School of Civil Engineering and Built Environment at the Central University of Technology, Free State, South Africa.
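
    The verification step itself can be illustrated with a short sketch: each parameter stored in the database used for the release calculation is compared against its field-measured value and flagged when the relative difference exceeds a chosen tolerance. The parameter names, values and tolerance below are assumptions, not Vaalharts data.

```python
# Hypothetical sketch of the data verification: compare canal parameters
# stored in the database against field measurements and flag entries
# that disagree by more than a chosen relative tolerance.
REL_TOLERANCE = 0.05      # accept up to 5 % relative difference

database = {              # values the release calculation currently uses
    ("main_canal", "capacity_m3_per_s"): 12.0,
    ("main_canal", "lag_time_h"): 26.0,
    ("branch_north", "seepage_loss_frac"): 0.040,
}
field = {                 # values measured on the scheme
    ("main_canal", "capacity_m3_per_s"): 11.4,
    ("main_canal", "lag_time_h"): 31.0,
    ("branch_north", "seepage_loss_frac"): 0.041,
}

for key, stored in database.items():
    measured = field.get(key)
    if measured is None:
        print(key, ": no field measurement available")
        continue
    rel_diff = abs(measured - stored) / abs(stored)
    status = "OK" if rel_diff <= REL_TOLERANCE else "UPDATE NEEDED"
    print(key, f": stored={stored}, measured={measured}, "
               f"diff={rel_diff:.1%} -> {status}")
```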

    Constraining the evolutionary history of Newton's constant with gravitational wave observations

    Space-borne gravitational wave detectors, such as the proposed Laser Interferometer Space Antenna (LISA), are expected to observe black hole coalescences to high redshift and with large signal-to-noise ratios, rendering their gravitational waves ideal probes of fundamental physics. Promoting Newton's constant to a function of time introduces modifications to the binary's binding energy and gravitational wave luminosity, leading to corrections in the chirping frequency. Such corrections propagate into the response function and, given a gravitational wave observation, they allow for constraints on the first time derivative of Newton's constant at the time of merger. We find that space-borne detectors could indeed place interesting constraints on this quantity as a function of sky position and redshift, providing a constraint map over the entire range of redshifts where binary black hole mergers are expected to occur. A LISA observation of an equal-mass inspiral event with total redshifted mass of 10^5 solar masses, observed for three years, should be able to measure \dot{G}/G at the time of merger to better than 10^(-11)/yr. Comment: 11 pages, 2 figures, replaced with version accepted for publication in Phys. Rev. D
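
    As a toy illustration of why a drifting Newton's constant changes the chirp, the sketch below integrates a purely Newtonian, quadrupole-driven circular inspiral with G(t) varying linearly in time and compares the accumulated gravitational-wave phase with the constant-G case. It is not the post-Newtonian treatment of the paper: the masses, starting separation and observation time are stellar-mass scale rather than the 10^5 solar-mass LISA system of the abstract, and the value of \dot{G}/G is exaggerated enormously relative to the 10^(-11)/yr bound so that the dephasing is visible at modest integration accuracy.

```python
# Toy Newtonian-order sketch (not the analysis of the paper): let G vary
# linearly in time, integrate the quadrupole-driven inspiral of a
# circular binary via energy balance, and compare the accumulated GW
# phase with the constant-G case. All parameter values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

G0 = 6.674e-11
c = 2.998e8
M_sun = 1.989e30
m1 = m2 = 10.0 * M_sun
M = m1 + m2

def rhs(t, y, gdot_over_g):
    r, phi = y
    G = G0 * (1.0 + gdot_over_g * t)
    # Binding energy E = -G m1 m2 / (2 r); quadrupole luminosity L.
    L = (32.0 / 5.0) * G**4 * (m1 * m2) ** 2 * M / (c**5 * r**5)
    dEdr = G * m1 * m2 / (2.0 * r**2)
    dEdG = -m1 * m2 / (2.0 * r)
    drdt = (-L - dEdG * G0 * gdot_over_g) / dEdr   # energy balance dE/dt = -L
    dphidt = 2.0 * np.sqrt(G * M / r**3)           # GW phase = 2 x orbital phase
    return [drdt, dphidt]

def gw_phase(gdot_over_g, r0=2.0e6, t_obs=100.0):
    sol = solve_ivp(rhs, (0.0, t_obs), [r0, 0.0], args=(gdot_over_g,),
                    rtol=1e-10, atol=1e-6)
    return sol.y[1, -1]

# Gdot/G = 1e-6 per second, a wildly exaggerated toy value.
dphi = gw_phase(1e-6) - gw_phase(0.0)
print("GW phase shift accumulated over the observation (radians):", dphi)
```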